feat: surface GLM-backed Codex and Claude runtimes #1823
Marve10s wants to merge 5 commits into pingdotgg:main from
Conversation
Add GLM as a Codex-backed provider that routes through a local Responses-to-Chat-Completions bridge. GLM sessions reuse the Codex app-server runtime while presenting as a separate provider in the UI.
- Contracts: add "glm" to ProviderKind, ModelSelection, GlmSettings, and all Record<ProviderKind, ...> exhaustiveness sites.
- Server: GlmAdapter (a thin CodexAdapter delegate), GlmProvider (snapshot service checking GLM_API_KEY), the GLM bridge (a loopback HTTP server translating Responses <-> Chat Completions), a shared codexLaunchConfig builder, and text-generation routing for GLM.
- Web: GLM in the provider picker, a settings panel with an env var hint, a composer registry entry, model selection config, and GlmIcon.
- Tests: 40 new tests covering bridge translation (Responses -> Chat Completions, and Chat Completions streaming -> Responses SSE) and the launch config builder, plus updates to existing tests for the new provider.
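For orientation, the request-side half of such a bridge can be sketched as below. The field shapes and the instructions-to-system-message mapping are illustrative assumptions, not the PR's actual code; the real bridge also has to handle tools, streaming, and SSE re-encoding.

```typescript
// Hypothetical, trimmed-down request shapes. Real Responses and Chat
// Completions payloads carry many more fields (tools, temperature, stream...).
type ResponsesRequest = {
  model: string;
  instructions?: string;
  input: string | Array<{ role: "user" | "assistant"; content: string }>;
};

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };
type ChatCompletionsRequest = { model: string; messages: ChatMessage[] };

// Translate a Responses-style request into a Chat Completions request:
// `instructions` becomes the system message, and a bare string input is
// treated as a single user message.
function toChatCompletions(req: ResponsesRequest): ChatCompletionsRequest {
  const messages: ChatMessage[] = [];
  if (req.instructions) {
    messages.push({ role: "system", content: req.instructions });
  }
  if (typeof req.input === "string") {
    messages.push({ role: "user", content: req.input });
  } else {
    for (const item of req.input) {
      messages.push({ role: item.role, content: item.content });
    }
  }
  return { model: req.model, messages };
}
```

A loopback HTTP server would accept the Responses-shaped request, apply a translation like this, forward it to the GLM Chat Completions endpoint, and re-encode the streamed reply as Responses SSE events on the way back.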
Yes, I have a lot of motivation to integrate GLM plans into Codex (they work with CC as well, btw).
- makeGlmAdapterLive now accepts and plumbs an options parameter
- Documented that GLM runtime events flow through the Codex adapter stream with provider="codex"; event re-attribution by ProviderService based on the session directory is a follow-up
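The deferred re-attribution could look roughly like this sketch; `AdapterEvent` and the session-directory set are hypothetical names for illustration, not the project's actual API:

```typescript
// Hypothetical event shape: the Codex adapter emits events tagged
// provider="codex" even for GLM-backed sessions.
type AdapterEvent = { provider: string; sessionDir: string; payload: unknown };

// Re-tag an event as "glm" when its session directory is known to belong to
// a GLM-backed session; otherwise pass it through unchanged.
function reattributeEvent(
  event: AdapterEvent,
  glmSessionDirs: ReadonlySet<string>,
): AdapterEvent {
  if (event.provider === "codex" && glmSessionDirs.has(event.sessionDir)) {
    return { ...event, provider: "glm" };
  }
  return event;
}
```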
Addressed both review comments in aaeaf5b. The correct follow-up is to add event re-attribution in ProviderService.
I've been using my GLM coding plan with Claude in t3code. |
🟠 High
t3code/apps/server/src/provider/Layers/CodexProvider.ts
Lines 538 to 552 in 7cf4cc3
The buildServerProvider call for unsupported CLI versions (lines 539-552) omits displayName, so users with a GLM model provider see the generic name "Codex" instead of "Codex / GLM" in the error state. This is inconsistent with all other buildServerProvider calls in this function, which include displayName. Consider adding displayName to this call to ensure the provider name is consistent across all status scenarios.
  if (parsedVersion && !isCodexCliVersionSupported(parsedVersion)) {
    return buildServerProvider({
      provider: PROVIDER,
      enabled: codexSettings.enabled,
      checkedAt,
      models,
+     displayName,
      probe: {
        installed: true,
        version: parsedVersion,
        status: "error",
        auth: { status: "unknown" },
        message: formatCodexCliUpgradeMessage(parsedVersion),
      },
    });
  }
Evidence trail:
- apps/server/src/provider/Layers/CodexProvider.ts:369-377 (codexDisplayName function showing "Codex / GLM" for GLM providers)
- apps/server/src/provider/Layers/CodexProvider.ts:447 (displayName computed)
- apps/server/src/provider/Layers/CodexProvider.ts:539-551 (buildServerProvider call for the unsupported CLI version, missing displayName)
- apps/server/src/provider/Layers/CodexProvider.ts:456-468, 479-494, 497-509, 519-535, 554-567, 584-601, 603-617, 621-638 (other buildServerProvider calls that all include displayName)
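Based on the evidence trail, the display-name rule presumably reduces to something like the following; the signature and the "glm" check are assumptions, only the "Codex / GLM" output is attested:

```typescript
// Assumed shape of the rule in codexDisplayName (CodexProvider.ts:369-377):
// a custom model_provider of "glm" yields "Codex / GLM", otherwise "Codex".
function codexDisplayName(modelProvider: string | undefined): string {
  return modelProvider === "glm" ? "Codex / GLM" : "Codex";
}
```

The review's point is that every buildServerProvider call in the status function should pass this computed displayName, including the unsupported-CLI-version branch.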
Summary
What changed
- CodexProvider now reads model_provider from Codex config, shows "Codex / GLM" when applicable, skips OpenAI login checks for custom backends, and swaps built-in suggestions to GLM models
- ClaudeProvider now detects Z.ai's Anthropic-compatible Claude setup from env or ~/.claude/settings.json, shows "Claude / GLM", and maps Claude tiers to configured GLM models
- Providers publish a live displayName, and the web picker, banners, chat labels, and settings cards use that runtime/backend label
- Removed the standalone glm provider, GLM bridge, GLM adapter, GLM settings surface, and related routing/contracts that implied a third runtime
User impact
- GLM-backed Codex users see "Codex / GLM" and GLM model suggestions instead of generic Codex/OpenAI labeling
- GLM-backed Claude users see "Claude / GLM" and the mapped GLM models instead of a misleading Claude-only surface
Validation
- bun fmt
- bun lint
- bun typecheck
- cd apps/server && bun run test src/provider/Layers/ProviderRegistry.test.ts src/provider/Layers/ProviderAdapterRegistry.test.ts
- Tests run with overridden HOME, CODEX_HOME, and T3CODE_HOME plus fake GLM-backed configs for Codex and Claude
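The Claude-side detection summarized above could be sketched like this; the settings.json shape, the ANTHROPIC_BASE_URL variable as the signal, and the z.ai host check are all assumptions for illustration:

```typescript
// Hypothetical subset of ~/.claude/settings.json: an env block that may set
// ANTHROPIC_BASE_URL to an Anthropic-compatible endpoint.
type ClaudeSettings = { env?: Record<string, string> };

// Treat the Claude setup as GLM-backed when the base URL (process env takes
// precedence over settings.json) points at a z.ai host.
function isZaiBackend(
  env: Record<string, string | undefined>,
  settings?: ClaudeSettings,
): boolean {
  const base = env["ANTHROPIC_BASE_URL"] ?? settings?.env?.["ANTHROPIC_BASE_URL"];
  if (!base) return false;
  try {
    const host = new URL(base).hostname;
    return host === "z.ai" || host.endsWith(".z.ai");
  } catch {
    return false; // malformed URL: not recognized as a GLM backend
  }
}
```

A tier-to-model mapping would then translate the configured Claude tiers into the corresponding GLM model identifiers before they reach the picker.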